Graph Stochastic Neural Networks for Semi-supervised Learning
Graph Neural Networks (GNNs) have achieved remarkable performance on the task of semi-supervised node classification. However, most existing models learn a deterministic classification function, which lacks sufficient flexibility to explore better choices in the presence of various kinds of imperfect observed data, such as scarce labeled nodes and noisy graph structure. To address the rigidness and inflexibility of deterministic classification functions, this paper proposes a novel framework named Graph Stochastic Neural Networks (GSNN), which aims to model the uncertainty of the classification function by simultaneously learning a family of functions, i.e., a stochastic function. Specifically, we introduce a learnable graph neural network coupled with a high-dimensional latent variable to model the distribution of the classification function, and further adopt amortised variational inference to approximate the intractable joint posterior over the missing labels and the latent variable. By maximizing the lower bound of the likelihood of the observed node labels, the instantiated models can be trained effectively in an end-to-end manner. Extensive experiments on three real-world datasets show that GSNN achieves substantial performance gains in different scenarios compared with state-of-the-art baselines.
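The core mechanism described above — a GNN coupled with a high-dimensional latent variable so that each sample of the latent variable realizes a different member of the function family — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `stochastic_gnn_forward`, the one-layer GCN-style propagation, and all shapes are assumptions for the sake of the example.

```python
# Minimal sketch (assumed, not the authors' code) of the GSNN idea:
# sample a latent vector z ~ N(0, I), concatenate it with every node's
# features, and run one GCN-style propagation step to produce class logits.
import numpy as np

def normalize_adj(A):
    """Symmetrically normalize adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def stochastic_gnn_forward(X, A, W, z_dim=4, rng=None):
    """One stochastic forward pass: each call draws a fresh z, so repeated
    calls realize different members of the modeled function family."""
    rng = rng or np.random.default_rng()
    n = X.shape[0]
    z = rng.standard_normal(z_dim)                         # z ~ N(0, I)
    Xz = np.concatenate([X, np.tile(z, (n, 1))], axis=1)   # append z to each node
    return normalize_adj(A) @ Xz @ W                       # GCN-style propagation -> logits

# Toy graph: 3 nodes, 2 features, 2 classes (all values illustrative).
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
W = rng.standard_normal((2 + 4, 2)) * 0.1
logits = stochastic_gnn_forward(X, A, W, z_dim=4, rng=rng)
print(logits.shape)  # (3, 2): one logit vector per node
```

Because the latent sample differs on every forward pass, repeated passes yield different predictions, which is what distinguishes this stochastic function from a deterministic classifier.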
Graph Stochastic Neural Networks for Semi-supervised Learning: Supplemental Material (Haibo Wang)
The supplemental material includes the following contents: a rigorous derivation of Eq. (11) from Eq. (4); the pseudo-code of GSNN in Algorithm 1; the detailed statistics of the three datasets used in this paper, listed in Table 1; and the six state-of-the-art graph-based semi-supervised learning baselines we compare against when evaluating performance in the standard and label-scarce scenarios.
- Information Technology > Artificial Intelligence > Machine Learning > Unsupervised or Indirectly Supervised Learning (0.63)
- Information Technology > Artificial Intelligence > Machine Learning > Inductive Learning (0.63)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.53)
Review for NeurIPS paper: Graph Stochastic Neural Networks for Semi-supervised Learning
Weaknesses: This paper combines latent variable models with GNNs; it is not novel enough, and there are many previous works with similar ideas in graph generation. The difference is that this paper's formulation is more like a conditional generative model and targets node classification tasks. Based on the implementation of the method, I think the model is similar to RGCN in some aspects. Admittedly, there are differences: the model does not directly learn a Gaussian representation but instead samples from a Gaussian latent variable and concatenates the sample with the node's features. However, both aim to inject noise and, in essence, decrease the mutual information between the representation and the original node features, so that the model captures only the key attributes and is thus more robust than vanilla GNNs.